

A Appendix: A.1 Supplemental Results

Neural Information Processing Systems

Figure 6: Model predictions across every Number Game concept in [33] (illustrated in Figure 1). For the number game, every model's outputs are transformed by a learned Platt transform; logical concept models do not use Platt transforms. We fit the Platt parameters using Adam with a learning rate of 0.001. For the number game, we use 10-fold cross-validation to calculate holdout predictions.
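A learned Platt transform maps a model's raw scores to calibrated probabilities via sigmoid(a·s + b), with (a, b) fit by minimizing cross-entropy. The sketch below is a minimal, hypothetical implementation of that fitting step, assuming binary labels and a hand-rolled Adam update with the learning rate of 0.001 mentioned above; the actual optimizer configuration beyond the learning rate is not specified in the text.

```python
import numpy as np

def platt_fit(scores, labels, lr=1e-3, steps=2000):
    """Fit Platt scaling parameters (a, b) so that sigmoid(a*s + b)
    approximates P(label = 1), minimizing mean binary cross-entropy
    with Adam (lr = 0.001, as reported; other hyperparameters are
    assumed defaults)."""
    s = np.asarray(scores, float)
    y = np.asarray(labels, float)
    a, b = 1.0, 0.0
    # Adam state for the two parameters (a, b)
    m = np.zeros(2)
    v = np.zeros(2)
    beta1, beta2, eps = 0.9, 0.999, 1e-8
    for t in range(1, steps + 1):
        p = 1.0 / (1.0 + np.exp(-(a * s + b)))  # predicted probabilities
        r = p - y                               # BCE residual
        g = np.array([np.mean(r * s), np.mean(r)])  # grad wrt (a, b)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        mhat = m / (1 - beta1 ** t)
        vhat = v / (1 - beta2 ** t)
        a -= lr * mhat[0] / (np.sqrt(vhat[0]) + eps)
        b -= lr * mhat[1] / (np.sqrt(vhat[1]) + eps)
    return a, b

def platt_apply(scores, a, b):
    """Apply a fitted Platt transform to new scores."""
    return 1.0 / (1.0 + np.exp(-(a * np.asarray(scores, float) + b)))
```

In a 10-fold cross-validation setup, (a, b) would be fit on nine folds and `platt_apply` used on the held-out fold to produce the holdout predictions.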





Human-like Few-Shot Learning via Bayesian Reasoning over Natural Language

Ellis, Kevin

arXiv.org Artificial Intelligence

A core tension in models of concept learning is that the model must carefully balance the tractability of inference against the expressivity of the hypothesis class. Humans, however, can efficiently learn a broad range of concepts. We introduce a model of inductive learning that seeks to be human-like in that sense. It implements a Bayesian reasoning process where a language model first proposes candidate hypotheses expressed in natural language, which are then re-weighted by a prior and a likelihood. By estimating the prior from human data, we can predict human judgments on learning problems involving numbers and sets, spanning concepts that are generative, discriminative, propositional, and higher-order.
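The re-weighting step described in the abstract can be sketched as standard Bayesian scoring: each candidate hypothesis receives weight proportional to prior(h) times likelihood(data | h), normalized into a posterior. The snippet below is a minimal sketch under assumptions not spelled out here: `prior` and `likelihood` are hypothetical callables standing in for the paper's learned prior and its data likelihood, and the hypothesis set is whatever the language model proposed.

```python
import math

def posterior_over_hypotheses(hypotheses, prior, likelihood, data):
    """Re-weight candidate hypotheses (e.g. proposed by a language model)
    by prior(h) * prod_x likelihood(x | h), normalized into a posterior.
    Works in log space for numerical stability."""
    log_w = [
        math.log(prior(h)) + sum(math.log(likelihood(x, h)) for x in data)
        for h in hypotheses
    ]
    mx = max(log_w)                      # log-sum-exp normalization
    w = [math.exp(l - mx) for l in log_w]
    z = sum(w)
    return [wi / z for wi in w]
```

As a toy usage on a Number-Game-style problem, a size-principle likelihood (1 / |extension of h| for consistent examples) makes the posterior concentrate on the smaller consistent hypothesis, e.g. "multiples of ten" over "even numbers" after seeing 10, 20, 30.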